A Kullback-Leibler Divergence for Bayesian Model Diagnostics.
Authors
Abstract
This paper considers a Kullback-Leibler distance (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert [1] when the reference model (in comparison to a competing fitted model) is correctly specified and certain regularity conditions hold (cf. Akaike [2]). We derive the asymptotic property of this Goutis-Robert-Akaike KLD under these regularity conditions. We also examine the impact on this asymptotic property when the regularity conditions are only partially satisfied. Furthermore, the connection between the Goutis-Robert-Akaike KLD and a weighted posterior predictive p-value (WPPP) is established. Finally, both the Goutis-Robert-Akaike KLD and the WPPP are applied to compare models in various simulated examples as well as in two cohort studies of diabetes.
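For reference (standard background, not quoted from the paper), the Kullback-Leibler divergence between a reference density f and a competing density g is

D_{KL}(f \,\|\, g) = \int f(y)\,\log\frac{f(y)}{g(y)}\,dy,

which is non-negative and equals zero only when f = g almost everywhere. The Goutis-Robert-Akaike KLD discussed in the abstract is a particular variant of this baseline quantity; its exact form is given in the paper itself.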
Similar Resources
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in the dynamic linear model, using the real price of oil over 106 years of data from 1913 to 2018, with attention to the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
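As background on the asymmetry point above, one standard parameterization of the LINEX loss (the paper's notation may differ) is

L(\Delta) = b\left[e^{a\Delta} - a\Delta - 1\right], \qquad a \neq 0,\; b > 0,\; \Delta = \hat{\theta} - \theta,

which penalizes overestimation and underestimation asymmetrically depending on the sign of a, in contrast to the symmetric quadratic loss L(\Delta) = \Delta^2.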
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating the true density h(.) based upon a random sample X1, …, Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model fθ(x). This article uses Vuong's (1989) test along with a collection of k (> 2) non-nested models to construct a set of appropriate models, say a model confidence set, for the unknown model h(.). Application of such confide...
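For context, one common form of Vuong's (1989) statistic for comparing two non-nested models f(.; θ) and g(.; γ) (generic notation, not the article's) is

T_n = \frac{1}{\sqrt{n}\,\hat{\omega}_n}\sum_{i=1}^{n}\log\frac{f(X_i; \hat{\theta}_n)}{g(X_i; \hat{\gamma}_n)},

where \hat{\omega}_n^2 is the sample variance of the individual log-likelihood ratios; under the null hypothesis that the two models are equally close to h(.) in Kullback-Leibler terms, T_n is asymptotically standard normal.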
On the estimation and influence diagnostics for the zero-inflated Conway-Maxwell-Poisson regression model: A full Bayesian analysis
In this paper we develop a Bayesian analysis for zero-inflated regression models based on the COM-Poisson distribution. Our approach is based on Markov chain Monte Carlo methods. We discuss model selection and develop case-deletion influence diagnostics for the joint posterior distribution based on the ψ-divergence, which has several divergence measures as particular cases, such as...
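As a sketch of the model class being discussed (generic notation that may differ from the paper's regression parameterization), a zero-inflated COM-Poisson model mixes a point mass at zero with a COM-Poisson component:

f_{\mathrm{CMP}}(y \mid \lambda, \nu) = \frac{\lambda^{y}}{(y!)^{\nu}\, Z(\lambda, \nu)}, \qquad Z(\lambda, \nu) = \sum_{j=0}^{\infty}\frac{\lambda^{j}}{(j!)^{\nu}},

P(Y = 0) = p + (1 - p)\, f_{\mathrm{CMP}}(0 \mid \lambda, \nu), \qquad P(Y = y) = (1 - p)\, f_{\mathrm{CMP}}(y \mid \lambda, \nu) \;\; (y \geq 1),

with covariates typically entering through link functions on \lambda and p.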
Numerical Evaluation of Information
Information theoretic measures are of paramount importance in Bayesian inference. Yet, their direct application in practice has been somewhat limited by the severe computational difficulties encountered in complex problems. In this paper we discuss implementation strategies for fast numerical computation of entropies and Kullback-Leibler divergences that are relevant to Bayesian inference and de...
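A minimal sketch of one such numerical strategy, plain Monte Carlo estimation of a Kullback-Leibler divergence, is given below; this illustrates the general idea only and is not the implementation discussed in the paper.

import numpy as np
from scipy import stats

def mc_kl_divergence(p, q, n_samples=100_000, seed=0):
    # Monte Carlo estimate of KL(p || q) = E_p[log p(X) - log q(X)],
    # where p and q are frozen scipy.stats continuous distributions.
    rng = np.random.default_rng(seed)
    x = p.rvs(size=n_samples, random_state=rng)
    return np.mean(p.logpdf(x) - q.logpdf(x))

# Sanity check against the closed form
# KL(N(m1, s1) || N(m2, s2)) = log(s2/s1) + (s1^2 + (m1 - m2)^2) / (2 s2^2) - 1/2.
p = stats.norm(loc=0.0, scale=1.0)
q = stats.norm(loc=1.0, scale=2.0)
print(mc_kl_divergence(p, q))                 # approximately 0.443
print(np.log(2.0) + (1.0 + 1.0) / 8.0 - 0.5)  # exact value, about 0.4431

The closed-form comparison for two normal densities serves only as a sanity check that the estimator converges to the right value.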
An Introduction to Inference and Learning in Bayesian Networks
Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in different subjects such as disease diagnosis, weather forecasting, decision making and clustering. A BN is a graphical-probabilistic model which represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...
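Concretely, the directed acyclic graph and the conditional probabilities define the joint distribution through the standard factorization (notation assumed here, not taken from the article)

P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P\!\left(X_i \mid \mathrm{Pa}(X_i)\right),

where Pa(X_i) denotes the set of parents of X_i in the graph.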
Kullback-Leibler Divergence for the Normal-Gamma Distribution
We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one in simulated and one in empirical data.
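For orientation, the normal-gamma distribution referred to above is the conjugate prior for the mean and precision of a normal likelihood; in one common parameterization (assumed here, not quoted from that paper) its density factors as

p(\mu, \tau \mid \mu_0, \lambda_0, a, b) = \mathcal{N}\!\left(\mu \mid \mu_0, (\lambda_0 \tau)^{-1}\right)\,\mathrm{Gamma}(\tau \mid a, b),

and the abstract above refers to the KL divergence between two distributions of this form.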
Journal: Open Journal of Statistics
Volume 1, Issue 3
Pages: -
Publication date: 2011